A fast algorithm for training support vector regression via smoothed primal function minimization
Author
Department of Mathematics, Missouri State University, Springfield, MO 65897. E-mail: [email protected]
Abstract
The support vector regression (SVR) model is usually fitted by solving a quadratic programming problem, which is computationally expensive. To improve computational efficiency, we propose to directly minimize the objective function in its primal form. However, the loss function used by SVR is not differentiable, which prevents the well-developed gradient-based optimization methods from being applicable. As such, we introduce a smooth function to approximate the original loss function in the primal form of SVR, which transforms the original quadratic program into a convex unconstrained minimization problem. The properties of the proposed smoothed objective function are discussed, and we prove that the solution of the smoothly approximated model converges to the original SVR solution. A conjugate gradient algorithm is designed for minimizing the proposed smoothly approximated objective function in a sequential minimization manner. Exten...
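The excerpt does not reproduce the paper's exact smoothing function, so the following Python sketch assumes a softplus-style smooth surrogate for the ε-insensitive loss max(0, |r| - ε) and minimizes the primal objective of a linear SVR with SciPy's nonlinear conjugate gradient; the names smoothed_svr_objective, the sharpness parameter beta, and the toy data are illustrative, not from the paper.

import numpy as np
from scipy.optimize import minimize

def softplus(z, beta):
    # (1/beta) * log(1 + exp(beta*z)) -> max(0, z) as beta -> infinity
    return np.logaddexp(0.0, beta * z) / beta

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def smoothed_svr_objective(theta, X, y, C, eps, beta):
    # primal objective 0.5*||w||^2 + C * sum_i s(r_i), where the smooth
    # surrogate s(r) = softplus(r - eps) + softplus(-r - eps) ~ max(0, |r| - eps)
    w, b = theta[:-1], theta[-1]
    r = y - X @ w - b
    loss = softplus(r - eps, beta) + softplus(-r - eps, beta)
    f = 0.5 * w @ w + C * loss.sum()
    dloss = sigmoid(beta * (r - eps)) - sigmoid(beta * (-r - eps))
    grad_w = w - C * X.T @ dloss          # chain rule through r = y - Xw - b
    grad_b = -C * dloss.sum()
    return f, np.append(grad_w, grad_b)

# usage on toy data: as beta grows, the fit approaches the original SVR solution
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)
res = minimize(smoothed_svr_objective, np.zeros(X.shape[1] + 1),
               args=(X, y, 1.0, 0.1, 50.0), jac=True, method='CG')
w_hat, b_hat = res.x[:-1], res.x[-1]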
Similar resources
Recursive Finite Newton Algorithm for Support Vector Regression in the Primal
Some algorithms in the primal have recently been proposed for training support vector machines. This letter follows those studies and develops a recursive finite Newton algorithm (IHLF-SVR-RFN) for training nonlinear support vector regression. The insensitive Huber loss function and the computation of the Newton step are discussed in detail. Comparisons with LIBSVM 2.82 show that the proposed a...
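For concreteness, here is one common parameterization of an insensitive Huber loss in Python: zero inside the ε-tube, quadratic for small violations, and linear beyond a threshold delta, which keeps the loss once continuously differentiable so that Newton-type steps are well defined; the exact form used by IHLF-SVR-RFN may differ.

import numpy as np

def insensitive_huber_loss(r, eps=0.1, delta=1.0):
    # t measures how far the residual leaves the eps-insensitive tube
    t = np.maximum(0.0, np.abs(r) - eps)
    quad = t ** 2 / (2.0 * delta)   # smooth near the tube boundary
    lin = t - delta / 2.0           # linear tail limits outlier influence
    return np.where(t <= delta, quad, lin)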
Training Lagrangian twin support vector regression via unconstrained convex minimization
In this paper, a new unconstrained convex minimization problem formulation is proposed as the Lagrangian dual of the 2-norm twin support vector regression (TSVR). The proposed formulation leads to two smaller-sized unconstrained minimization problems whose objective functions are piecewise quadratic and differentiable. It is further proposed to apply a gradient-based iterative method for solv...
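A 2-norm formulation squares the plus function, and (max(0, z))^2 is differentiable with derivative 2*max(0, z), which is what makes a plain gradient iteration applicable. The sketch below uses an assumed generic objective 0.5*||u||^2 + (C/2)*||(A u - b)_+||^2 of that piecewise-quadratic type, not the paper's actual TSVR dual pair.

import numpy as np

def grad(u, A, b, C):
    # gradient of 0.5*||u||^2 + (C/2)*||max(0, A@u - b)||^2,
    # using d/dz (max(0, z))^2 = 2*max(0, z)
    return u + C * A.T @ np.maximum(0.0, A @ u - b)

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 10))
b = rng.normal(size=50)
u, C, step = np.zeros(10), 1.0, 0.01
for _ in range(500):                 # fixed-step gradient iteration
    u -= step * grad(u, A, b, C)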
An Epsilon Hierarchical Fuzzy Twin Support Vector Regression
The research presents ε-hierarchical fuzzy twin support vector regression (ε-HFTSVR) based on ε-fuzzy twin support vector regression (ε-FTSVR) and ε-twin support vector regression (ε-TSVR). ε-FTSVR is achieved by incorporating trapezoidal fuzzy numbers into ε-TSVR, which takes care of the uncertainty existing in forecasting problems. ε-FTSVR determines a pair of ε-insensitive proximal functions by so...
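As background for the fuzzy ingredient: a trapezoidal fuzzy number (a, b, c, d) has a membership function that rises linearly on [a, b], equals 1 on [b, c], and falls linearly on [c, d]; a minimal Python sketch (assuming a < b <= c < d) follows. How ε-FTSVR weights training points with these memberships is not recoverable from this truncated excerpt.

import numpy as np

def trapezoidal_membership(x, a, b, c, d):
    # 0 outside [a, d]; linear ramps on [a, b] and [c, d]; plateau of 1 on [b, c]
    x = np.asarray(x, dtype=float)
    rise = (x - a) / (b - a)
    fall = (d - x) / (d - c)
    return np.clip(np.minimum(rise, np.minimum(1.0, fall)), 0.0, 1.0)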
Sparse algorithm for robust LSSVM in primal space
Owing to its closed-form solution, the least squares support vector machine (LSSVM) has been widely used for classification and regression problems, with performance comparable to other types of SVMs. However, LSSVM has two drawbacks: it is sensitive to outliers and lacks sparseness. Robust LSSVM (R-LSSVM) partly overcomes the first drawback via a nonconvex truncated loss function, but the current algor...
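A truncated loss caps each example's penalty so that extreme outliers stop dominating the fit; a minimal sketch of a truncated squared loss (the specific truncation used in R-LSSVM may be parameterized differently) is:

import numpy as np

def truncated_squared_loss(r, tau=2.0):
    # min(r^2, tau^2): quadratic for small residuals, constant beyond tau,
    # so the influence of gross outliers is bounded (at the cost of convexity)
    return np.minimum(r ** 2, tau ** 2)

The nonconvexity introduced by the truncation is typically handled by writing the loss as a difference of convex functions, min(r^2, tau^2) = r^2 - max(0, r^2 - tau^2), and solving a sequence of convex subproblems.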
Exploiting Strong Convexity from Data with Primal-Dual First-Order Algorithms
We consider empirical risk minimization of linear predictors with convex loss functions. Such problems can be reformulated as convex-concave saddle-point problems and are thus well suited to primal-dual first-order algorithms. However, primal-dual algorithms often require explicit strongly convex regularization in order to obtain fast linear convergence, and the required dual proximal mappi...
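For reference, the standard saddle-point reformulation (notation assumed here, not taken from the excerpt) writes the regularized empirical risk of a linear predictor via the convex conjugates \phi_i^* of the losses:

\min_{w} \; \frac{1}{n}\sum_{i=1}^{n}\phi_i(a_i^{\top}w) + g(w)
\;=\;
\min_{w}\,\max_{y}\; \frac{1}{n}\sum_{i=1}^{n}\left( y_i\, a_i^{\top}w - \phi_i^{*}(y_i) \right) + g(w),

where the a_i are the data points, g is the regularizer, and the inner maximization recovers each loss from its conjugate via \phi_i(z) = \sup_y \left( y z - \phi_i^{*}(y) \right).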
Journal: Int. J. Machine Learning & Cybernetics
Volume: 6
Issue: -
Pages: -
Publication date: 2015